American Imperialism

Carter Paul and Mat Lau
Imperialism is entrenched in American culture, dating back as far as the Pilgrims, who landed in foreign lands only to soon conquer them. It is part of what defines us as Americans, and it has almost singlehandedly made us the world power that we are today. But what is imperialism, really?

Imperialism (noun):
The policy of extending the rule or authority of an empire or nation over foreign countries, or of acquiring and holding colonies and dependencies.
1. Economic Imperialism
  • Taking control of foreign lands in order to utilize the natural resources and raw materials they contain for the benefit of the imperializing country
2. Idealism and Manifest Destiny
  • A feeling of superiority holding that it is our duty as Americans to bring light, civility, and order to the darkest places of the world
  • "The White Man's Burden"

Take up the White Man's burden--
The savage wars of peace--
Fill full the mouth of Famine
And bid the sickness cease;
And when your goal is nearest
The end for others sought,
Watch sloth and heathen Folly
Bring all your hopes to nought.

3. Militaristic Imperialism
  • Taking control of territory because of its significance to the military, mainly due to its location

The Monroe Doctrine (1823):
  • Established that any European interference in the Western Hemisphere would be viewed as an aggressive act requiring American retaliation
  • Kept European powers out of the affairs of the Americas
  • Left America virtually unopposed against smaller countries and territories

Imperialism has been a basis of American thought throughout our past, for it is a kill-or-be-killed world, and America intends to survive. The following wikispaces describe numerous occasions on which America has imperialized, ranging from the Louisiana Purchase of 1803 to the annexation of Hawaii in 1898.